648 research outputs found

    Adaptive Bayesian and frequentist data processing for quantum tomography

    The outcome statistics of an informationally complete quantum measurement for a system in a given state can be used to evaluate the ensemble expectation of any linear operator in the same state, by averaging a function of the outcomes that depends on the specific operator. Here we introduce two novel data-processing strategies, non-linear in the frequencies, which lead to faster convergence to theoretical expectations. Comment: 12 pages, 2 figures, revised version
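    As context for the data processing described above, here is a minimal sketch of the standard linear averaging baseline that such non-linear strategies improve on — not the paper's method. It uses a hypothetical six-outcome informationally complete POVM on a qubit (Pauli eigenprojectors, each weighted by 1/3) and reconstructs the state by linear inversion of the outcome frequencies:

    ```python
    import numpy as np

    # Toy illustration (not the paper's estimators): linear inversion for a
    # six-outcome IC POVM on one qubit. The effects project onto the Pauli
    # eigenstates, each weighted by 1/3, so they sum to the identity.
    I2 = np.eye(2)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    def bloch_state(r):
        """Density matrix for Bloch vector r = (x, y, z)."""
        x, y, z = r
        return 0.5 * (I2 + x * sx + y * sy + z * sz)

    # Eigenvectors of sz, sx, sy (in +, - order), each effect weighted by 1/3.
    kets = [np.array([1, 0]), np.array([0, 1]),
            np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2),
            np.array([1, 1j]) / np.sqrt(2), np.array([1, -1j]) / np.sqrt(2)]
    povm = [np.outer(k, k.conj()) / 3 for k in kets]

    rho = bloch_state((0.3, -0.2, 0.5))  # "true" state for the simulation
    probs = np.array([np.trace(E @ rho).real for E in povm])

    # Simulate N measurement outcomes and form outcome frequencies.
    rng = np.random.default_rng(0)
    N = 200_000
    freqs = rng.multinomial(N, probs) / N

    # Linear inversion: each Bloch component is 3x a frequency difference.
    z = 3 * (freqs[0] - freqs[1])
    x = 3 * (freqs[2] - freqs[3])
    y = 3 * (freqs[4] - freqs[5])
    rho_hat = bloch_state((x, y, z))

    # The expectation of any operator, e.g. sigma_z, follows by linearity.
    exp_sz = np.trace(rho_hat @ sz).real
    ```

    At finite sample size this estimator fluctuates around the true value ⟨σz⟩ = 0.5; the paper's point is that suitably chosen non-linear functions of the frequencies converge faster.
    
    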

    Differences in the carcinogenic evaluation of glyphosate between the International Agency for Research on Cancer (IARC) and the European Food Safety Authority (EFSA)

    The International Agency for Research on Cancer (IARC) Monographs Programme identifies chemicals, drugs, mixtures, occupational exposures, lifestyles and personal habits, and physical and biological agents that cause cancer in humans, and has evaluated about 1000 agents since 1971. Monographs are written by ad hoc Working Groups (WGs) of international scientific experts over a period of about 12 months ending in an eight-day meeting. The WG evaluates all of the publicly available scientific information on each substance and, through a transparent and rigorous process, decides on the degree to which the scientific evidence supports that substance's potential to cause or not cause cancer in humans. For Monograph 112, 17 expert scientists evaluated the carcinogenic hazard of four insecticides and the herbicide glyphosate. The WG concluded that the data for glyphosate meet the criteria for classification as a probable human carcinogen. The European Food Safety Authority (EFSA) is the primary agency of the European Union for risk assessments regarding food safety. In October 2015, EFSA reported on its evaluation of the Renewal Assessment Report (RAR) for glyphosate that was prepared by the Rapporteur Member State, the German Federal Institute for Risk Assessment (BfR). EFSA concluded that "glyphosate is unlikely to pose a carcinogenic hazard to humans and the evidence does not support classification with regard to its carcinogenic potential". Addendum 1 (the BfR Addendum) of the RAR discusses the scientific rationale for differing from the IARC WG conclusion. Serious flaws in the scientific evaluation in the RAR incorrectly characterise the potential for a carcinogenic hazard from exposure to glyphosate. Since the RAR is the basis for the EFSA conclusion, it is critical that these shortcomings are corrected.

    Understanding Factors Associated With Psychomotor Subtypes of Delirium in Older Inpatients With Dementia


    Search for massive resonances decaying into WW, WZ or ZZ bosons in proton-proton collisions at √s = 13 TeV

    Peer reviewed

    Search for long-lived charged particles in proton-proton collisions at √s = 13 TeV

    Peer reviewed

    Measurement of prompt and nonprompt J/ψ production in pp and pPb collisions at √s_NN = 5.02 TeV

    Peer reviewed

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk — reflecting the human inability to predict the future precisely, as written in the Al-Qur'an, Surah Luqman, verse 34 — they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios, or alternatively, to maximize expected return among all portfolios that have at least a certain expected return. Furthermore, this study focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function xᵀQx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio over several investments, obtained using MATLAB R2007b software together with graphical analysis
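    As a minimal sketch of the quadratic program described in this abstract (not the study's MATLAB code): if only the fully-invested constraint 1ᵀx = 1 is imposed, the minimum-variance problem min xᵀQx has a closed-form solution x* = Q⁻¹1 / (1ᵀQ⁻¹1). The covariance matrix below is made up for the example.

    ```python
    import numpy as np

    # Hypothetical covariance matrix Q for three assets (illustrative only).
    Q = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

    y = np.linalg.solve(Q, np.ones(3))  # y = Q^{-1} 1
    w = y / y.sum()                     # minimum-variance weights, sum to 1
    min_var = w @ Q @ w                 # portfolio variance at the optimum
    ```

    Adding the target-return constraint μᵀx ≥ r removes the closed form and requires a general quadratic-programming solver, such as MATLAB's `quadprog` (as in the study) or a Python QP routine.
    
    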

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV

    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons, in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹. Peer reviewed
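    The kinematic replacement step described above can be illustrated with a toy sketch (this is not CMS software; the event record and field names are hypothetical): each muon's three-momentum is kept, the particle is relabeled as a τ, and its energy is recomputed with the τ mass so the four-vector is on-shell.

    ```python
    import math

    # Toy sketch of the muon-to-tau replacement (illustrative only).
    M_TAU = 1.77686  # tau lepton mass in GeV

    def embed_as_tau(muon):
        """Return a tau four-vector carrying the muon's three-momentum."""
        px, py, pz = muon["px"], muon["py"], muon["pz"]
        p2 = px * px + py * py + pz * pz
        # Same momentum direction and magnitude; energy recomputed for a tau.
        return {"pid": 15, "px": px, "py": py, "pz": pz,
                "E": math.sqrt(p2 + M_TAU ** 2)}

    # A hypothetical dimuon event: remove the muons, insert the taus.
    event = {"muons": [{"px": 25.0, "py": 3.0, "pz": -10.0},
                       {"px": -24.0, "py": -2.5, "pz": 40.0}]}
    hybrid = {"taus": [embed_as_tau(mu) for mu in event["muons"]]}
    ```

    In the actual technique the embedded τ leptons are then decayed in simulation, while everything else in the hybrid event is taken from data.
    
    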